Fix grammar in LoRA documentation#13423

Open
Xyc2016 wants to merge 1 commit into huggingface:main from Xyc2016:fix-lora-grammar
Conversation

@Xyc2016 Xyc2016 commented Apr 6, 2026

Summary

  • Fix `LoRA's` → `LoRAs` in two places (lines 104 and 106): plural, not possessive
  • Fix `trigger it` → `trigger them` (pronoun agreement: "Some LoRAs" is plural)

Changes

Line 104:

Among adapters, [LoRA's] are the most popular.
→ Among adapters, [LoRAs] are the most popular.

Line 106:

Some LoRA's require a special word to trigger it
→ Some LoRAs require a special word to trigger them

Location

docs/source/en/quicktour.md, lines 104 and 106

Copilot AI review requested due to automatic review settings April 6, 2026 14:42
Contributor

Copilot AI left a comment

Pull request overview

Updates the LoRA section of the quick tour documentation to correct pluralization/grammar around trigger words for LoRAs.

Changes:

  • Fixes “Some LoRA's … trigger it” to “Some LoRAs … trigger them” for correct plural and pronoun agreement.

@@ -103,7 +103,7 @@ export_to_video(video, "output.mp4", fps=16)

Adapters insert a small number of trainable parameters to the original base model. Only the inserted parameters are fine-tuned while the rest of the model weights remain frozen. This makes it fast and cheap to fine-tune a model on a new style. Among adapters, [LoRA's](./tutorials/using_peft_for_inference) are the most popular.
Copy link

Copilot AI Apr 6, 2026

Line 104 still uses "LoRA's" as a plural (with an apostrophe), while line 106 was corrected to "LoRAs". For consistency and correct grammar, update the link text "[LoRA's]" to the plural form "[LoRAs]", since it is not possessive here.

Suggested change
Adapters insert a small number of trainable parameters to the original base model. Only the inserted parameters are fine-tuned while the rest of the model weights remain frozen. This makes it fast and cheap to fine-tune a model on a new style. Among adapters, [LoRA's](./tutorials/using_peft_for_inference) are the most popular.
Adapters insert a small number of trainable parameters to the original base model. Only the inserted parameters are fine-tuned while the rest of the model weights remain frozen. This makes it fast and cheap to fine-tune a model on a new style. Among adapters, [LoRAs](./tutorials/using_peft_for_inference) are the most popular.
@Xyc2016 Xyc2016 force-pushed the fix-lora-grammar branch from 4a66ac4 to a67e929 Compare April 6, 2026 15:09